- continuous random quantity
- Makarov: непрерывная случайная величина (continuous random variable)
Универсальный англо-русский словарь (Universal English-Russian Dictionary). Академик.ру, 2011.
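For context beyond the bare translation pair, the term denotes what standard probability texts call a continuous random variable: a random quantity X whose distribution admits a probability density. A minimal statement of that convention, with f denoting the (assumed) density of X, is:

\[
P(a \le X \le b) = \int_a^b f(x)\,dx, \qquad F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt .
\]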
See also in other dictionaries:

- Quantity — a kind of property which exists as magnitude or multitude; among the basic classes of things along with quality, substance, change, and relation (Wikipedia)
- Continuous-time Markov process — a stochastic process {X(t) : t ≥ 0} that satisfies the Markov property and takes values in a set called the state space; the continuous-time version of a Markov chain (Wikipedia)
- Random variable — a quantity that takes any of a set of values with specified probabilities; also called variate (Universalium)
- Convergence of random variables — the convergence of a sequence of random variables to some limit random variable, an important concept in probability theory with several distinct notions (Wikipedia)
- Timeline of thermodynamics, statistical mechanics, and random processes — a timeline of events related to thermodynamics, statistical mechanics, and random processes (Wikipedia)
- Probability distribution (Wikipedia)
- Statistics — the science that deals with the collection, classification, analysis, and interpretation of numerical facts or data (Universalium)
- Time (Wikipedia)
- Kullback–Leibler divergence — also called information divergence, information gain, relative entropy, or KLIC; a non-symmetric measure of the difference between two probability distributions (Wikipedia)
- Amorphous solid — any noncrystalline solid in which the atoms and molecules are not organized in a definite lattice pattern, such as glass, plastic, and gel (Universalium)
- Mutual information — the mutual information I(X; Y) of a pair of correlated subsystems X, Y, related to their individual entropies H(X), H(Y), joint entropy H(X,Y), and conditional entropies (Wikipedia)